Backpropagation generalized for output derivatives

Author

  • V. I. Avrutskiy
Abstract

The backpropagation algorithm is the cornerstone of neural network analysis. This paper extends it to training arbitrary derivatives of a neural network's output with respect to its input. By means of this extension, feedforward networks can be used to solve or verify solutions of partial or ordinary, linear or nonlinear differential equations. The method differs substantially from traditional mesh-based approaches such as finite differences: it contains no approximations, but rather an exact form of the differential operators. The algorithm is built to train a feedforward network with any number of hidden layers and any sufficiently smooth activation functions. It is presented in the form of matrix-vector products, so a highly parallel implementation is readily possible. The first part derives the method for the 2D case with first- and second-order derivatives; the second part extends it to the N-dimensional case with derivatives of any order. All expressions necessary for applying the method to the most common applied PDEs can be found in Appendix D.
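The paper's contribution is an explicit, matrix-vector form of backpropagation through derivative channels, not a framework autodiff trick. As a minimal orientation sketch (plain numpy, a single scalar input, tanh activations throughout; all names are illustrative, and a real network would typically use a linear readout layer), the forward half of the idea propagates exact first and second input-derivatives alongside the usual activations:

```python
import numpy as np

def forward_with_derivatives(x, weights, biases):
    """Propagate activations together with their exact first and
    second derivatives with respect to the scalar input x through
    a tanh feedforward network. Chain rule only, no finite
    differences."""
    a = np.array([x])    # current layer's activations
    da = np.ones(1)      # da/dx: the input varies with itself at slope 1
    dda = np.zeros(1)    # d2a/dx2: the input is linear in x
    for W, b in zip(weights, biases):
        z = W @ a + b    # pre-activation, affine in a
        dz = W @ da      # dz/dx is exact because z is affine in a
        ddz = W @ dda    # likewise for the second derivative
        s = np.tanh(z)
        s1 = 1.0 - s**2        # tanh'(z)
        s2 = -2.0 * s * s1     # tanh''(z)
        a = s
        da = s1 * dz                 # first-order chain rule
        dda = s2 * dz**2 + s1 * ddz  # second-order chain rule
    return a, da, dda
```

The returned derivative channels can enter a loss such as (u''(x) - f(x))^2 on a set of sample points; training against such a loss, i.e. backpropagating through these channels, is what the paper's generalized algorithm provides in closed form.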


Similar resources

Modeling with Recurrent Neural Networks using Generalized Mean Neuron Model

This paper presents the use of the generalized mean neuron model (GMN) in recurrent neural networks (RNNs). The GMN includes a new aggregation function based on the concept of the generalized mean of all the inputs to the neuron. Learning is implemented on-line from input-output data, using an alternative approach to the recurrent backpropagation learning algorithm. The learning and generaliza...
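The exact aggregation form used by the paper sits behind the truncation above; a textbook power-mean neuron, which is presumably the underlying idea, can be sketched as follows (plain numpy; weighted inputs are assumed positive so that fractional powers stay real, and p = 1 recovers an ordinary mean of the weighted inputs):

```python
import numpy as np

def generalized_mean_neuron(x, w, p, f=np.tanh):
    """Aggregate weighted inputs with a generalized (power) mean
    of order p instead of a plain weighted sum, then apply the
    activation f. This is the textbook power mean; the GMN paper's
    exact normalization may differ."""
    m = np.mean((w * x) ** p) ** (1.0 / p)
    return f(m)
```

Varying p interpolates between min-like (p → −∞), mean-like (p = 1), and max-like (p → +∞) aggregation, which is what gives the model its extra flexibility.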


Biologically Plausible Error-driven Learning using Local Activation Differences: The Generalized Recirculation Algorithm

The error backpropagation learning algorithm (BP) is generally considered biologically implausible because it does not use locally available, activation-based variables. A version of BP that can be computed locally using bi-directional activation recirculation (Hinton & McClelland, 1988) instead of backpropagated error derivatives is more biologically plausible. This paper presents a generalize...
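For orientation, a recirculation-style update in the spirit of O'Reilly's GeneRec (this sketch is illustrative, not the paper's exact formulation) replaces the backpropagated derivative with a difference of locally available activations from two phases:

```python
import numpy as np

def generec_update(W, x_minus, y_minus, y_plus, lr=0.1):
    """GeneRec-style weight update. The minus phase is the network's
    free-running prediction, the plus phase has the target clamped
    on the outputs; the error signal (y_plus - y_minus) is purely
    local to the receiving units."""
    return W + lr * np.outer(x_minus, y_plus - y_minus)
```

No derivative of the activation function appears anywhere, which is what makes the rule biologically plausible in the sense the paper discusses.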


Variational Principle for the Generalized KdV-Burgers Equation with Fractal Derivatives for Shallow Water Waves

An unsmooth boundary greatly affects the motion morphology of a shallow water wave, so a fractal space is introduced to establish a generalized KdV-Burgers equation with fractal derivatives. The semi-inverse method is used to establish a fractal variational formulation of the problem, which provides conservation laws in an energy form in the fractal space and possible solution structures of t...
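For reference, the classical (non-fractal) KdV-Burgers equation from which the fractal generalization starts can be written as

```latex
\frac{\partial u}{\partial t} + u\,\frac{\partial u}{\partial x}
  - \alpha\,\frac{\partial^{2} u}{\partial x^{2}}
  + \beta\,\frac{\partial^{3} u}{\partial x^{3}} = 0,
```

where α and β are generic dissipation and dispersion coefficients (the names are ours, not the paper's); the fractal version replaces these derivatives with fractal derivatives taken with respect to fractal space and time measures.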


Equivalence of Equilibrium Propagation and Recurrent Backpropagation

Recurrent Backpropagation and Equilibrium Propagation are algorithms for fixed-point recurrent neural networks that differ in their second phase. In the first phase, both algorithms converge to a fixed point which corresponds to the configuration where the prediction is made. In the second phase, Recurrent Backpropagation computes error derivatives, whereas Equilibrium Propagation relaxes to an...
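Schematically, Equilibrium Propagation nudges the outputs toward the target with a small strength β in the second phase and contrasts the two fixed points (notation here is generic: F is the total energy, s^0 and s^β the free and nudged fixed points):

```latex
\Delta\theta \;\propto\; -\frac{1}{\beta}\left(
  \frac{\partial F}{\partial \theta}\bigl(\theta,\beta,s^{\beta}\bigr)
  - \frac{\partial F}{\partial \theta}\bigl(\theta,0,s^{0}\bigr)
\right),
```

and in the limit β → 0 this recovers the error derivatives that Recurrent Backpropagation computes directly, which is the equivalence at stake.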


Cephalometric analysis for finding facial growth abnormalities

Cephalometric analysis of lateral radiographs of the head is an important diagnostic tool in orthodontics. Because it relies on manually locating specific landmarks, it is a tedious, lengthy, and error-prone task. The objective of this work is to calculate the SNA, SNB, and ANB angles between the landmarks, in order to identify the input and output parameters pertaining to skeletal abnormalities. By doing s...
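The angle computation itself is elementary once the landmarks are located; a minimal sketch (plain numpy, hypothetical landmark coordinates) is:

```python
import numpy as np

def angle(vertex, p1, p2):
    """Angle at `vertex` formed by the rays toward p1 and p2, in degrees."""
    u, v = p1 - vertex, p2 - vertex
    c = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(c, -1.0, 1.0)))

# SNA is the angle at Nasion between Sella and A-point, SNB the same
# with B-point, and ANB is commonly taken as their difference.
# Coordinates below are hypothetical placeholders.
S, N, A, B = map(np.array, ([0.0, 0.0], [70.0, 5.0], [68.0, -40.0], [66.0, -55.0]))
SNA = angle(N, S, A)
SNB = angle(N, S, B)
ANB = SNA - SNB
```

The clinically meaningful step is then comparing these angles against normative ranges to flag skeletal abnormalities, as the abstract describes.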




Journal:
  • CoRR

Volume: abs/1712.04185  Issue:

Pages: -

Publication date: 2017